Cloud Operations - Logging

Introduction to cloud operations services

Cloud Operations is the stack of services that helps you monitor, log, and debug your applications and reduce latency in your APIs and Google Cloud services. We have already seen the Monitoring service from the Cloud Operations stack. In this lesson, we will cover the rest of the services.

Logging#

Cloud Logging is the centralized log service for all resources hosted on Google Cloud. All types of logs, including network and access logs, are captured and stored by the Logging service.

How does this work? What is the architecture that supports this type of centralized logging mechanism? Let’s have a look at Cloud Logging and its components.

[Architecture diagram: log sources (Audit logs, Service logs, App logs, syslog, Platform logs) flow into the Cloud Logging API. Behind the API, the Log Router applies exclusion filters and log sinks (inclusion filters) to route entries to Log Storage, Cloud Storage, BigQuery, Pub/Sub, or external log-management software such as Splunk and Sumo Logic. Stored logs support log search and analysis, logs-based metrics, log error analysis, dashboards built from logs, and alerting from logs.]

The architecture is simple and straightforward. Logs from all resources are forwarded to the Cloud Logging API. Behind the logging API, the Log Router determines whether a log should be stored, excluded, or exported, based on the filters applied to it.

Because the filters are applied in parallel, you can exclude a log from storage and export it to an event-processing system at the same time. Audit logs and network logs are the most common types of logs exported to long-term storage classes in Cloud Storage buckets.
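The parallel-filter behavior described above can be modeled in a few lines. This is a hypothetical sketch, not the actual Log Router implementation; the entry fields, filter predicates, and sink names are made up for illustration.

```python
# Hypothetical sketch of Log Router behavior: exclusion filters and
# sink (inclusion) filters are evaluated independently, so a log entry
# can be dropped from default storage yet still exported by a sink.

def route(entry, exclusion_filters, sinks):
    """Return (stored, exported_to) for a single log entry.

    entry             -- dict, e.g. {"type": "audit", "severity": "INFO"}
    exclusion_filters -- list of predicates; any match drops the entry
                         from Cloud Logging storage
    sinks             -- dict of sink name -> inclusion predicate;
                         every matching sink exports the entry
    """
    stored = not any(f(entry) for f in exclusion_filters)
    exported_to = [name for name, f in sinks.items() if f(entry)]
    return stored, exported_to

# Example: exclude DEBUG noise from storage, but still export all
# audit logs to a long-term Cloud Storage sink.
exclusions = [lambda e: e["severity"] == "DEBUG"]
sinks = {"audit-to-gcs": lambda e: e["type"] == "audit"}

print(route({"type": "audit", "severity": "DEBUG"}, exclusions, sinks))
# -> (False, ['audit-to-gcs']) : not stored, but still exported
```

Note that exclusion only affects storage; the sink list is computed independently, which is exactly why a single entry can be both excluded and exported.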

Log viewer#

To access the log viewer, open the main menu > Cloud Operations > Logging > logs viewer.

  • The Logs viewer provides an interface to query logs and inspect the logs of a specific resource.

  • You can use the select menu options to filter resources, or write string queries directly.

Log viewer and logs of all available services.
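String queries in the viewer use the Logging query language. As an illustration (the resource type and payload text below are made-up values, not tied to any real project), a filter for error-level Compute Engine logs might look like:

```
resource.type="gce_instance"
severity>=ERROR
textPayload:"connection refused"
```

Each line is ANDed together: `=` is an exact match, `>=` compares severity levels, and `:` is a substring match.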

Logs dashboard#

The Logs dashboard gives you an at-a-glance view of all the services, showing logs by severity level. There are several severity levels; the most common are:

  • Default
  • Error
  • Info
  • Notice
  • Debug

You can also download the charts for presentations or any other use.

Logs-based metrics#

This window shows predefined and user-defined logs-based metrics. These metrics count the number of log entries that match certain filters created for logs. We can also create alerts based on these metrics.
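A logs-based counter metric boils down to counting entries that match a filter, and an alerting policy compares that count to a threshold. The sketch below is illustrative only; the entries, filter, and threshold are assumptions, not real Cloud Monitoring behavior.

```python
# Hypothetical sketch of a logs-based counter metric: count the log
# entries matching a filter, then flag an alert once the count
# crosses a threshold for the evaluation window.

def count_matching(entries, log_filter):
    """Counter-metric value: number of entries the filter matches."""
    return sum(1 for e in entries if log_filter(e))

entries = [
    {"severity": "ERROR", "msg": "db timeout"},
    {"severity": "INFO",  "msg": "request ok"},
    {"severity": "ERROR", "msg": "db timeout"},
]

error_count = count_matching(entries, lambda e: e["severity"] == "ERROR")
ALERT_THRESHOLD = 2  # assumed policy: alert at 2+ errors per window
should_alert = error_count >= ALERT_THRESHOLD

print(error_count, should_alert)  # -> 2 True
```

In the real service the filter is written in the Logging query language and the threshold lives in an alerting policy, but the counting logic is the same idea.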

Log router#

The Log Router is used to create log sinks to particular export locations. As shown in the diagram, you can export logs to different tools using the Log Router.

To create a sink, click the Create Sink button and fill in the form.

Creating log sink.

Log storage#

This window shows the location and bucket details of log storage. Log storage is charged at $0.50/GB once you exceed the 50 GB/project free quota.

Audit logs are stored in a separate bucket and have a maximum retention period of 400 days. For the rest of the logs, the retention period is 30 days, after which your logs will no longer be accessible.

You can create an alert for this so that once you approach the 50 GB mark, you can apply exclusions to stop storing irrelevant logs.
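The billing arithmetic from the figures quoted above is simple enough to check directly. This sketch only encodes the numbers stated in this lesson (50 GB free per project, $0.50/GB beyond that); actual pricing and quotas may change, so consult the official pricing page.

```python
# Illustrative cost check using the figures quoted in this lesson:
# the first 50 GB per project is free, overage is billed at $0.50/GB.
FREE_QUOTA_GB = 50
PRICE_PER_GB = 0.50

def monthly_logging_cost(ingested_gb):
    """Cost in USD for a given volume of ingested logs."""
    overage_gb = max(0.0, ingested_gb - FREE_QUOTA_GB)
    return overage_gb * PRICE_PER_GB

print(monthly_logging_cost(40))   # under quota -> 0.0
print(monthly_logging_cost(130))  # 80 GB over  -> 40.0
```

A threshold alert at, say, 45 GB would give you time to add exclusions before any overage is billed.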

Find out the Quotas and Limits for logging here.
